Added Vercel AI Gateway as a provider #11
Conversation
@jerilynzheng Thanks! Can you check one thing for me? Someone noticed earlier that with Prime Intellect's inference on the OAI API, there was a mismatch in cost tracking. Can you verify with Vercel (I don't have an API key, so I can't test this) that cost tracking works? E.g., when you run the RLM example, can you print the RLMChatCompletion object and send a screenshot here? Just make sure that the costs it reports are correct.
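One rough way to sanity-check the reported cost is to recompute it from the token usage and the provider's per-token prices and compare against the value on the completion object. The field names and prices in this sketch are illustrative assumptions, not the project's actual `RLMChatCompletion` API or Vercel's real pricing.

```python
# Hypothetical cost sanity check: recompute cost from token counts and
# an assumed price table, then compare with the reported cost.
# Prices below are illustrative placeholders, not real provider rates.

ASSUMED_PRICES = {  # USD per 1M tokens (placeholder numbers)
    "prompt": 3.00,
    "completion": 15.00,
}

def expected_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Recompute the USD cost from token counts and the price table."""
    return (
        prompt_tokens * ASSUMED_PRICES["prompt"]
        + completion_tokens * ASSUMED_PRICES["completion"]
    ) / 1_000_000

def cost_matches(reported: float, prompt_tokens: int,
                 completion_tokens: int, tol: float = 1e-6) -> bool:
    """True if the reported cost agrees with the recomputed one."""
    return abs(reported - expected_cost(prompt_tokens, completion_tokens)) < tol
```

For example, with the placeholder prices above, 1000 prompt tokens and 500 completion tokens should come out to 0.0105 USD; if the completion object reports something else, the provider's pricing metadata is likely mismatched.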
Hey @alexzhang13, are you thinking of using DSPy for managing LLM inference, given it already has integrations with the different providers through LiteLLM?
Haven't thought of this yet; could be convinced to add it, though.
Hi @alexzhang13, sorry for the delay! Here's the screenshot:

Summary
Added Vercel AI Gateway as a provider using OpenAI-compatible API
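Since the gateway speaks the OpenAI-compatible API, wiring it up mostly amounts to pointing a standard client at the gateway's base URL with a bearer token. The sketch below shows the request shape using only the standard library; the base URL and header names follow the usual OpenAI convention and are assumptions to check against Vercel's documentation, not values taken from this PR.

```python
import json
import urllib.request

# Assumed OpenAI-compatible endpoint for Vercel AI Gateway; verify the
# exact base URL and auth scheme against Vercel's documentation.
BASE_URL = "https://ai-gateway.vercel.sh/v1"

def build_chat_request(api_key: str, model: str,
                       messages: list) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",  # standard OpenAI-style auth
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Because the request shape is identical to OpenAI's, an existing OpenAI client can usually be reused by overriding only the base URL and API key.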
Changes